Efficient Optimization for Sparse Gaussian Process Regression
Authors
Abstract
Similar Resources
Efficient Optimization for Sparse Gaussian Process Regression: Supplementary Material
K is the rank-n full covariance matrix to be factorized. K does not need to be precomputed (which would take O(n²) storage); it only needs to return its diagonal and a specific column when queried (via a function handle, for example). If σ is supplied, the algorithm below operates with an additional twist allowing the augmentation trick introduced in Sec. 3 of the paper, in which case the matrix L in the alg...
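To make the query-only access to K concrete, below is a minimal sketch of a pivoted incomplete Cholesky factorization in Python/NumPy. It only illustrates the access pattern described above (diagonal plus single columns on demand); it is not the paper's exact algorithm, the helper names k_diag, k_col, and incomplete_cholesky are hypothetical, and the σ-augmentation twist from Sec. 3 is omitted.

import numpy as np

def incomplete_cholesky(k_diag, k_col, n, max_rank, tol=1e-12):
    # Pivoted incomplete Cholesky of an implicit n x n PSD matrix K.
    # K is never formed: k_diag() returns its diagonal and k_col(j) returns
    # column j, mirroring the "query diagonal / query one column" access above.
    # Returns L (n x m) with L @ L.T approximately K, plus the chosen pivots.
    d = np.asarray(k_diag(), dtype=float).copy()   # residual diagonal of K - L L^T
    L = np.zeros((n, max_rank))
    pivots = []
    for m in range(max_rank):
        j = int(np.argmax(d))                      # greedy pivot: largest residual variance
        if d[j] <= tol:                            # remaining approximation error is negligible
            return L[:, :m], pivots
        pivots.append(j)
        col = np.asarray(k_col(j), dtype=float)    # fetch a single column of K on demand
        L[:, m] = (col - L[:, :m] @ L[j, :m]) / np.sqrt(d[j])
        d -= L[:, m] ** 2                          # update residual diagonal
        d[j] = 0.0                                 # pivot column is now reproduced exactly
    return L, pivots

# Example: K induced by an RBF kernel over 1-D inputs, accessed only through closures.
X = np.linspace(0.0, 1.0, 200)[:, None]
sq = lambda a, b: np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
k_diag = lambda: np.ones(len(X))
k_col = lambda j: np.exp(-0.5 * sq(X, X[j:j + 1]))[:, 0]
L, piv = incomplete_cholesky(k_diag, k_col, len(X), max_rank=20)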
Sparse Greedy Gaussian Process Regression
We present a simple sparse greedy technique to approximate the maximum a posteriori estimate of Gaussian Processes with much improved scaling behaviour in the sample size m. In particular, computational requirements are O(n²m), storage is O(nm), the cost of prediction is O(n), and the cost to com...
Sparse Spectrum Gaussian Process Regression
We present a new sparse Gaussian Process (GP) model for regression. The key novel idea is to sparsify the spectral representation of the GP. This leads to a simple, practical algorithm for regression tasks. We compare the achievable trade-offs between predictive accuracy and computational requirements, and show that these are typically superior to existing state-of-the-art sparse approximations...
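As a rough illustration of the spectral idea, the sketch below performs sparse-spectrum-style GP regression as Bayesian linear regression on trigonometric features built from sampled spectral frequencies of an RBF kernel. It is only an approximation of the approach under stated assumptions: the function names and hyperparameter defaults are made up for the example, and the full method additionally optimizes the spectral points rather than keeping them fixed random samples.

import numpy as np

def trig_features(X, W, sf2):
    # Feature map whose inner products approximate an RBF kernel with signal variance sf2.
    proj = X @ W.T                                          # (n, m) projections onto spectral samples
    return np.hstack([np.cos(proj), np.sin(proj)]) * np.sqrt(sf2 / W.shape[0])

def ssgp_predict(X, y, X_test, m=100, lengthscale=1.0, sf2=1.0, sn2=0.1, seed=0):
    # Sparse-spectrum-style regression: Bayesian linear regression on 2m trigonometric
    # basis functions, costing O(n m^2) instead of the O(n^3) of exact GP regression.
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / lengthscale, size=(m, X.shape[1]))  # sampled spectral frequencies
    Phi = trig_features(X, W, sf2)                          # (n, 2m) training features
    A = Phi.T @ Phi + sn2 * np.eye(2 * m)                   # posterior precision (times sn2)
    w_mean = np.linalg.solve(A, Phi.T @ y)                  # posterior mean of the weights
    return trig_features(X_test, W, sf2) @ w_mean           # predictive mean at test inputs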
Efficient Gaussian process regression for large datasets.
Gaussian processes are widely used in nonparametric regression, classification and spatiotemporal modelling, facilitated in part by a rich literature on their theoretical properties. However, one of their practical limitations is expensive computation, typically on the order of n³ where n is the number of data points, in performing the necessary matrix inversions. For large datasets, storage an...
Incremental Variational Sparse Gaussian Process Regression
Recent work on scaling up Gaussian process regression (GPR) to large datasets has primarily focused on sparse GPR, which leverages a small set of basis functions to approximate the full Gaussian process during inference. However, the majority of these approaches are batch methods that operate on the entire training dataset at once, precluding the use of datasets that are streaming or too large ...
Journal
Journal title: IEEE Transactions on Pattern Analysis and Machine Intelligence
Year: 2015
ISSN: 0162-8828, 2160-9292
DOI: 10.1109/tpami.2015.2424873